
# Multi-task fine-tuning optimization

## Mistral Ft Optimized 1227

OpenPipe · Apache-2.0 · Large Language Model · Transformers

A hierarchical SLERP merge of several strong open-source models, including OpenHermes-2.5, neural-chat-7b, MetaMath-Mistral-7B, and openchat-3.5, intended as a powerful foundation model for a variety of downstream fine-tuning tasks (see the SLERP sketch below).
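For readers unfamiliar with the merge technique named above: SLERP (spherical linear interpolation) blends two models' weights along the arc between them rather than along a straight line, which better preserves the geometry of each parent's parameters. Below is a minimal sketch, assuming NumPy and per-tensor merging of flattened weights; the `slerp` helper and the equal-weight `t = 0.5` split are illustrative, not the model's published merge recipe.

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flattened weight tensors."""
    # Normalize copies to measure the angle between the two parameter directions.
    u0 = v0 / (np.linalg.norm(v0) + eps)
    u1 = v1 / (np.linalg.norm(v1) + eps)
    dot = float(np.clip(np.dot(u0, u1), -1.0, 1.0))
    theta = np.arccos(dot)  # angle between the two weight vectors
    if np.sin(theta) < eps:
        # Nearly parallel vectors: spherical and linear interpolation coincide.
        return (1.0 - t) * v0 + t * v1
    # Interpolate along the arc between the two vectors.
    w0 = np.sin((1.0 - t) * theta) / np.sin(theta)
    w1 = np.sin(t * theta) / np.sin(theta)
    return w0 * v0 + w1 * v1

# Example: merge two hypothetical weight tensors with equal weight (t = 0.5).
a = np.random.randn(4096)
b = np.random.randn(4096)
merged = slerp(0.5, a, b)
```

A "hierarchical" merge applies this pairwise interpolation in stages, merging intermediate results rather than all parents at once.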
## Baby Llama 58m

timinar · Large Language Model · Transformers · English

Baby Llama is a 58-million-parameter language model distilled from LLaMA and GPT-2 teachers, built specifically for the Small Language Model Challenge (a distillation-loss sketch follows below).
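Distillation, as used for Baby Llama, trains the small student model to match a larger teacher's softened output distribution rather than only the hard next-token labels. Below is a minimal sketch of the generic soft-label distillation loss in PyTorch, assuming single-teacher logits; Baby Llama's published recipe distills from an ensemble of teachers, and the `distillation_loss` name and temperature value here are illustrative.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """Soft-label KD loss: KL divergence from the temperature-softened teacher
    distribution to the student's, averaged over the batch."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # The T^2 factor rescales gradients so the loss magnitude stays
    # comparable across temperature settings.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

# Example: a batch of 8 positions over a 32,000-token vocabulary.
student = torch.randn(8, 32000)
teacher = torch.randn(8, 32000)
loss = distillation_loss(student, teacher)
```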